    Multilayered Nanostructures Integrated with Emerging Technologies

    Surface and interface functionalization are crucial steps to introduce new functionalities in numerous applications, as faster dynamics occur at surfaces than in the bulk. Within this context, the layer-by-layer (LbL) technique is a versatile methodology to form organized nanostructures in a controlled manner from the spontaneous adsorption of charged molecules. It enables the assembly of multilayered LbL films on virtually any surface using non-covalent molecular interactions, allowing the nanoengineering of interfaces and the creation of multifunctional systems with distinct building blocks (polymers, clays, metal nanoparticles, enzymes, organic macromolecules, etc.). Several applications require thin films on electrodes for sensing/biosensing, and here we explore LbL films deposited on interdigitated electrodes (IDEs) that were 3D-printed using the fused deposition modeling (FDM) technique. IDEs covered with LbL films can be used to form multisensory systems employed in the analysis of complex liquids, transforming raw data into specific patterns easily recognized by computational and statistical methods. We extend the FDM 3D-printing methodology to simplify the manufacturing of electrodes and microchannels, thus integrating an e-tongue system into a microfluidic device. Moreover, the continuous flow within microchannels contributes to faster and more accurate analysis, reducing the amount of sample, waste, and costs.
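
    As a rough illustration of the pattern-recognition step mentioned above, the sketch below applies principal component analysis to hypothetical capacitance readings from an array of LbL-coated IDE sensors; the array size and all numbers are invented for illustration and are not taken from this work.

        # Hypothetical example: projecting e-tongue sensor readings onto two
        # principal components so that different liquid samples form visible clusters.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        # Rows = measurements of a liquid sample, columns = capacitance values
        # read from each LbL-coated IDE in the array (values are made up).
        readings = np.array([
            [1.02, 0.87, 1.10, 0.95],   # sample A, replicate 1
            [1.00, 0.85, 1.12, 0.93],   # sample A, replicate 2
            [0.61, 1.35, 0.70, 1.40],   # sample B, replicate 1
            [0.63, 1.33, 0.72, 1.38],   # sample B, replicate 2
        ])

        # Standardize each sensor channel, then reduce to two components.
        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(readings))
        print(scores)  # replicates of the same sample should land close together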

    Optimization of Risk Portfolios Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, given the human inability to predict the future precisely, as written in Al-Quran surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective here is to minimize the variance among all portfolios that attain at least a certain expected return, or alternatively, to maximize the expected return for a given level of risk. Furthermore, this study focuses on optimizing the risk portfolio with the so-called Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks used in the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Moreover, finding a minimum-variance portfolio leads to a convex quadratic program: minimizing the objective function xᵀQx subject to the constraints μᵀx ≥ R and Ax = b. The outcome of this research is the optimal risk portfolio for a set of investments, which can be computed smoothly using MATLAB R2007b software together with its graphical analysis.
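
    For concreteness, the following Python sketch solves the same kind of convex quadratic program (minimum variance subject to a return floor and a budget constraint); the covariance matrix, expected returns, and target return are invented example values, and Python is used here merely as a stand-in for the MATLAB implementation mentioned in the abstract.

        # Hedged sketch of the Markowitz mean-variance QP described above.
        import numpy as np
        from scipy.optimize import minimize

        Q = np.array([[0.10, 0.02, 0.01],     # covariance matrix of asset returns
                      [0.02, 0.08, 0.03],
                      [0.01, 0.03, 0.12]])
        mu = np.array([0.06, 0.09, 0.12])     # expected returns of the three assets
        r_target = 0.08                       # required minimum expected return

        objective = lambda x: x @ Q @ x       # portfolio variance x'Qx
        constraints = [
            {"type": "ineq", "fun": lambda x: mu @ x - r_target},  # mu'x >= R
            {"type": "eq",   "fun": lambda x: np.sum(x) - 1.0},    # weights sum to 1 (Ax = b)
        ]
        bounds = [(0.0, 1.0)] * 3             # no short selling (illustrative choice)

        result = minimize(objective, x0=np.full(3, 1/3), bounds=bounds,
                          constraints=constraints, method="SLSQP")
        print(result.x)                       # optimal portfolio weights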

    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
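
    As a toy illustration of the flux-versus-accumulation distinction emphasized above, the snippet below compares a hypothetical autophagosome-marker level with and without a block of lysosomal degradation; the numbers and the simple difference used as a flux proxy are assumptions for illustration, not a prescription from the guidelines.

        # Toy illustration of the static-count vs. flux distinction discussed above.
        # A hypothetical autophagosome marker level is compared with and without a
        # block of lysosomal degradation; the numbers are invented placeholders.
        marker_untreated = 1.0    # marker level with lysosomal degradation active
        marker_inhibited = 2.6    # marker level when degradation is blocked

        # If blocking degradation raises the marker substantially, material was being
        # delivered to and degraded in lysosomes, i.e. flux was occurring; a high
        # static marker level alone would not demonstrate this.
        flux_proxy = marker_inhibited - marker_untreated
        print(flux_proxy)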

    Search for heavy resonances decaying to two Higgs bosons in final states containing four b quarks

    A search is presented for narrow heavy resonances X decaying into pairs of Higgs bosons (H) in proton-proton collisions collected by the CMS experiment at the LHC at √s = 8 TeV. The data correspond to an integrated luminosity of 19.7 fb⁻¹. The search considers HH resonances with masses between 1 and 3 TeV, having final states of two b quark pairs. Each Higgs boson is produced with large momentum, and the hadronization products of the pair of b quarks can usually be reconstructed as single large jets. The background from multijet and tt̄ events is significantly reduced by applying requirements related to the flavor of the jet, its mass, and its substructure. The signal would be identified as a peak on top of the dijet invariant mass spectrum of the remaining background events. No evidence is observed for such a signal. Upper limits obtained at 95% confidence level for the product of the production cross section and branching fraction σ(gg → X) B(X → HH → bb̄bb̄) range from 10 to 1.5 fb for the mass of X from 1.15 to 2.0 TeV, significantly extending previous searches. For a warped extra dimension theory with a mass scale Λ_R = 1 TeV, the data exclude radion scalar masses between 1.15 and 1.55 TeV.
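
    For context, the dijet invariant mass mentioned above follows from the four-momenta of the two Higgs-candidate jets; the sketch below evaluates the standard relativistic formula on invented jet four-vectors and is not CMS analysis code.

        # Invariant mass of a two-jet system from the jet four-momenta (E, px, py, pz).
        # The jet four-vectors below are invented numbers, purely for illustration.
        import numpy as np

        jet1 = np.array([620.0, 150.0, -80.0, 590.0])   # (E, px, py, pz) in GeV
        jet2 = np.array([540.0, -140.0, 90.0, -505.0])

        def invariant_mass(p1, p2):
            e = p1[0] + p2[0]
            px, py, pz = p1[1:] + p2[1:]
            return np.sqrt(max(e**2 - (px**2 + py**2 + pz**2), 0.0))

        print(f"m(jj) = {invariant_mass(jet1, jet2):.1f} GeV")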

    Measurement of the top quark mass using charged particles in pp collisions at √s = 8 TeV

    Peer reviewed

    Search for supersymmetry in events with one lepton and multiple jets in proton-proton collisions at √s = 13 TeV

    Peer reviewed

    Measurement of the Splitting Function in pp and Pb-Pb Collisions at √sNN = 5.02 TeV

    Data from heavy ion collisions suggest that the evolution of a parton shower is modified by interactions with the color charges in the dense partonic medium created in these collisions, but it is not known where in the shower evolution the modifications occur. The momentum ratio of the two leading partons, resolved as subjets, provides information about the parton shower evolution. This substructure observable, known as the splitting function, reflects the process of a parton splitting into two other partons and has been measured for jets with transverse momentum between 140 and 500 GeV, in pp and PbPb collisions at a center-of-mass energy of 5.02 TeV per nucleon pair. In central PbPb collisions, the splitting function indicates a more unbalanced momentum ratio, compared to peripheral PbPb and pp collisions. The measurements are compared to various predictions from event generators and analytical calculations.
    Peer reviewed
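
    As a hedged illustration of the observable described above: if the two leading subjets carry transverse momenta pT1 and pT2, their momentum sharing is commonly summarized by the fraction zg = min(pT1, pT2)/(pT1 + pT2). The snippet below simply evaluates that ratio for invented subjet momenta; the grooming and detector-level details of the measurement are omitted.

        # Momentum sharing fraction z_g of the two leading subjets; the subjet
        # transverse momenta below are invented example values in GeV.
        def momentum_sharing(pt1, pt2):
            return min(pt1, pt2) / (pt1 + pt2)

        print(momentum_sharing(180.0, 60.0))   # balanced splittings give z_g near 0.5,
                                               # unbalanced ones push z_g toward 0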

    Development and validation of HERWIG 7 tunes from CMS underlying-event measurements

    This paper presents new sets of parameters (“tunes”) for the underlying-event model of the HERWIG 7 event generator. These parameters control the description of multiple-parton interactions (MPI) and colour reconnection in HERWIG 7, and are obtained from a fit to minimum-bias data collected by the CMS experiment at √s = 0.9, 7, and 13 TeV. The tunes are based on the NNPDF 3.1 next-to-next-to-leading-order parton distribution function (PDF) set for the parton shower, and either a leading-order or next-to-next-to-leading-order PDF set for the simulation of MPI and the beam remnants. Predictions utilizing the tunes are produced for event shape observables in electron-positron collisions, and for minimum-bias, inclusive jet, top quark pair, and Z and W boson events in proton-proton collisions, and are compared with data. Each of the new tunes describes the data at a reasonable level, and the tunes using a leading-order PDF for the simulation of MPI provide the best description of the data.
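
    The tuning procedure referred to above amounts, schematically, to choosing generator parameter values that minimize a chi-square between predictions and measured distributions. The sketch below illustrates that idea with a stand-in prediction function and invented data points; it does not call the actual event generator and is not the procedure used in the paper.

        # Very schematic illustration of "tuning": pick parameter values that
        # minimize a chi-square against measured points. The prediction function,
        # the data points, and their uncertainties are all invented stand-ins.
        import numpy as np
        from scipy.optimize import minimize

        data = np.array([5.1, 4.3, 3.6])       # hypothetical measured observable bins
        errors = np.array([0.3, 0.25, 0.2])    # their uncertainties

        def prediction(params, bin_index):
            # Stand-in for running the event generator with the given parameters.
            p_mpi, p_cr = params
            return p_mpi * np.exp(-p_cr * bin_index)

        def chi2(params):
            bins = np.arange(len(data))
            return np.sum(((prediction(params, bins) - data) / errors) ** 2)

        best = minimize(chi2, x0=[5.0, 0.1], method="Nelder-Mead")
        print(best.x)   # "tuned" parameter values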

    Measurement of nuclear modification factors of ϒ(1S), ϒ(2S), and ϒ(3S) mesons in PbPb collisions at √sNN = 5.02 TeV

    The cross sections for ϒ(1S), ϒ(2S), and ϒ(3S) production in lead-lead (PbPb) and proton-proton (pp) collisions at √sNN = 5.02 TeV have been measured using the CMS detector at the LHC. The nuclear modification factors, RAA, derived from the PbPb-to-pp ratio of yields for each state, are studied as functions of meson rapidity and transverse momentum, as well as PbPb collision centrality. The yields of all three states are found to be significantly suppressed, and compatible with a sequential ordering of the suppression, RAA(ϒ(1S)) > RAA(ϒ(2S)) > RAA(ϒ(3S)). The suppression of ϒ(1S) is larger than that seen at √sNN = 2.76 TeV, although the two are compatible within uncertainties. The upper limit on the RAA of ϒ(3S) integrated over pT, rapidity, and centrality is 0.096 at 95% confidence level, which is the strongest suppression observed for a quarkonium state in heavy ion collisions to date. © 2019 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). Funded by SCOAP3.
    Peer reviewed
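
    Schematically, the nuclear modification factor compares the yield per PbPb collision with the pp expectation scaled by the nuclear overlap function; the snippet below evaluates RAA = yield_PbPb / (T_AA × σ_pp) with invented placeholder numbers, not the measured values.

        # Schematic nuclear modification factor R_AA = yield_PbPb / (T_AA * sigma_pp).
        # All numbers below are invented placeholders, not CMS measurements.
        def r_aa(yield_pbpb_per_event, t_aa, sigma_pp):
            # yield_pbpb_per_event: signal yield per PbPb event
            # t_aa: nuclear overlap function (1/mb) for the chosen centrality class
            # sigma_pp: corresponding pp cross section (mb)
            return yield_pbpb_per_event / (t_aa * sigma_pp)

        print(r_aa(yield_pbpb_per_event=2.0e-6, t_aa=5.6, sigma_pp=1.0e-6))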

    Search for new particles in events with energetic jets and large missing transverse momentum in proton-proton collisions at √s = 13 TeV

    A search is presented for new particles produced at the LHC in proton-proton collisions at √s = 13 TeV, using events with energetic jets and large missing transverse momentum. The analysis is based on a data sample corresponding to an integrated luminosity of 101 fb⁻¹, collected in 2017-2018 with the CMS detector. Machine learning techniques are used to define separate categories for events with narrow jets from initial-state radiation and events with large-radius jets consistent with a hadronic decay of a W or Z boson. A statistical combination is made with an earlier search based on a data sample of 36 fb⁻¹, collected in 2016. No significant excess of events is observed with respect to the standard model background expectation determined from control samples in data. The results are interpreted in terms of limits on the branching fraction of an invisible decay of the Higgs boson, as well as constraints on simplified models of dark matter, on first-generation scalar leptoquarks decaying to quarks and neutrinos, and on models with large extra dimensions. Several of the new limits, specifically for spin-1 dark matter mediators, pseudoscalar mediators, colored mediators, and leptoquarks, are the most restrictive to date.
    Peer reviewed
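
    As a brief aside on the key observable: missing transverse momentum is conventionally the magnitude of the negative vector sum of the transverse momenta of the reconstructed objects in an event. The snippet below evaluates that definition for a few invented object momenta and is not CMS reconstruction code.

        # Missing transverse momentum: magnitude of the negative vector sum of the
        # transverse momentum vectors (px, py) of reconstructed objects.
        # The object momenta below are invented numbers in GeV.
        import numpy as np

        objects_pxpy = np.array([
            [250.0,  40.0],   # leading jet
            [-60.0, -30.0],   # subleading jet
            [ 15.0,  10.0],   # soft activity
        ])

        met_vector = -objects_pxpy.sum(axis=0)
        met = np.hypot(*met_vector)
        print(f"missing pT = {met:.1f} GeV")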